Strong Unimodality and Exact Learning of Constant Depth µ-Perceptron Networks
Authors
Abstract
We present a statistical method that exactly learns the class of constant depth μ-perceptron networks with weights taken from {−1, 0, +1} and arbitrary thresholds, when the distribution that generates the input examples is a member of the family of product distributions. These networks (also known as nonoverlapping perceptron networks or read-once formulas over a weighted threshold basis) are loop-free neural nets in which each node has only one outgoing weight. With arbitrarily high probability, the learner is able to exactly identify the connectivity (or skeleton) of the target μ-perceptron network by using a new statistical test which exploits the strong unimodality property of sums of independent random variables.
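The abstract does not spell out the test itself, but the property it rests on is easy to probe empirically: a {−1, 0, +1}-weighted sum of independent inputs drawn from a product distribution has a unimodal (in fact log-concave) histogram. The following Python sketch, with hypothetical helper names and illustrative weights rather than anything taken from the paper, samples such sums and checks unimodality of the empirical distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

def weighted_sum_samples(weights, probs, n_samples=100_000):
    """Sample sum_i w_i * x_i with independent x_i ~ Bernoulli(p_i)."""
    x = (rng.random((n_samples, len(weights))) < probs).astype(int)
    return x @ weights

def is_unimodal(counts):
    """True if the sequence rises to a single peak and then falls."""
    d = np.sign(np.diff(counts))
    d = d[d != 0]                                    # ignore flat stretches
    return not np.any((d[:-1] == -1) & (d[1:] == 1))  # never up after down

weights = np.array([+1, -1, +1, -1, +1])   # weights from {-1, 0, +1}
probs   = np.array([0.3, 0.7, 0.5, 0.9, 0.4])
s = weighted_sum_samples(weights, probs)
counts = np.bincount(s - s.min())          # histogram over the full support
print(is_unimodal(counts))                 # expected: True
```

By Ibragimov's theorem, strong unimodality of a distribution is equivalent to log-concavity, and Bernoulli variables are log-concave; since sums of independent log-concave integer variables stay log-concave, the check above is expected to pass.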
Similar resources
4. Multilayer perceptrons and back-propagation
Multilayer feed-forward networks, or multilayer perceptrons (MLPs), have one or several "hidden" layers of nodes. This implies that they have two or more layers of weights. The limitations of simple perceptrons do not apply to MLPs. In fact, as we will see later, a network with just one hidden layer can represent any Boolean function (including the XOR which is, as we saw, not linearly separab...
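As a concrete illustration of the one-hidden-layer claim, here is a minimal hand-wired threshold network for XOR; the weights and helper names are illustrative choices, not taken from the text above.

```python
import numpy as np

def step(z):
    return (z >= 0).astype(int)

def xor_mlp(x):
    """x: array of two bits. Hidden layer computes OR and NAND; output ANDs them."""
    W1 = np.array([[ 1,  1],   # h1 = OR(x1, x2):   x1 + x2 - 0.5 >= 0
                   [-1, -1]])  # h2 = NAND(x1, x2): -x1 - x2 + 1.5 >= 0
    b1 = np.array([-0.5, 1.5])
    h = step(W1 @ x + b1)
    # output = AND(h1, h2): h1 + h2 - 1.5 >= 0
    return step(np.array([1, 1]) @ h - 1.5)

for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, int(xor_mlp(np.array(x))))   # prints 0, 1, 1, 0
```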
Parisi Phase in a Neuron
Pattern storage by a single neuron is revisited. Generalizing Parisi's framework for spin glasses, we obtain a variational free energy functional for the neuron. The solution is demonstrated at high temperature and large relative number of examples, where several phases are identified by thermodynamical stability analysis, two of them exhibiting spontaneous full replica symmetry breaking. We g...
On Learning µ-Perceptron Networks with Binary Weights
Neural networks with binary weights are very important from both the theoretical and practical points of view. In this paper, we investigate the learnability of single binary perceptrons and unions of μ-binary-perceptron networks, i.e. an “OR” of binary perceptrons where each input unit is connected to one and only one perceptron. We give a polynomial time algorithm that PAC learns these networ...
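The network class in question is easy to state in code. The sketch below, with an assumed block layout and illustrative weights and thresholds, evaluates an "OR" of binary perceptrons in which each input feeds exactly one perceptron (the read-once constraint).

```python
import numpy as np

def mu_union(x, blocks, weights, thresholds):
    """OR over binary perceptrons; block i reads only its own slice of x."""
    outs = []
    for (lo, hi), w, t in zip(blocks, weights, thresholds):
        outs.append(int(w @ x[lo:hi] >= t))   # binary perceptron on its block
    return int(any(outs))

x = np.array([1, 0, 1, 1, 0, 1])
blocks = [(0, 3), (3, 6)]                     # disjoint: each input used once
weights = [np.array([1, -1, 1]), np.array([-1, 1, 1])]
thresholds = [2, 1]
print(mu_union(x, blocks, weights, thresholds))  # 1: first block fires (1-0+1 >= 2)
```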
Polyhedrons and Perceptrons Are Functionally Equivalent
Mathematical definitions of polyhedrons and perceptron networks are discussed. The formalization of polyhedrons is done in a rather traditional way. For networks, previously proposed systems are developed. Perceptron networks in disjunctive normal form (DNF) and conjunctive normal form (CNF) are introduced. The main theme is that single output perceptron neural networks and characteristic func...
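One direction of the equivalence is immediate to demonstrate: membership in a polyhedron {x : Ax ≤ b} is an AND of half-space indicators, i.e. a two-layer perceptron network. The sketch below, with an illustrative A and b, checks that the two computations agree on the unit square.

```python
import numpy as np

def in_polyhedron(x, A, b):
    return bool(np.all(A @ x <= b))

def perceptron_net(x, A, b):
    h = (b - A @ x >= 0).astype(int)   # one threshold unit per face
    return bool(h.sum() >= len(b))     # output unit fires iff all units fire

# Unit square [0, 1]^2 as the intersection of four half-spaces.
A = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]])
b = np.array([1, 0, 1, 0])
for x in (np.array([0.5, 0.5]), np.array([1.5, 0.5])):
    assert in_polyhedron(x, A, b) == perceptron_net(x, A, b)
```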
Identification Criteria and Lower Bounds for Perceptron-like Learning Rules
Perceptron-like learning rules are known to require exponentially many correction steps in order to identify Boolean threshold functions exactly. We introduce criteria that are weaker than exact identification and investigate whether learning becomes significantly faster if exact identification is replaced by one of these criteria: PAC identification, order identification, and sign identification. P...
Journal title:
Volume/Issue:
Pages: -
Publication date: 1995